The VC-Dimension: Definitions

Authors

  • Yishay Mansour
  • Ben Klein
  • Tal Kaminker
  • Eli Daian
  • Itamar Eskin
  • Yonatan Dinai
Abstract

In the PAC model we assume there exists a distribution D on the examples X that the learner receives; i.e., each instance in the sample is drawn according to D. We assume that:

1. D is fixed throughout the learning process.
2. D is unknown to the learner.
3. The instances are chosen independently.

The target concept is specified as a computable function c, so our instances are of the form ⟨x, c(x)⟩. Our goal is to find a function h which approximates c with respect to D, in the following sense. Let

error(h) = Prob_D[c(x) ≠ h(x)].

We would like to ensure that error(h) is below a certain threshold ε, which is given as a parameter to the algorithm; this parameter measures the accuracy of the learning process. As a measure of our confidence in the outcome of the learning process we add another parameter δ, and require that

Prob[error(h) ≤ ε] ≥ 1 − δ.
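As a concrete illustration (not part of the original notes), error(h) can be estimated empirically on a sample drawn from D. The target c, hypothesis h, and the uniform distribution below are hypothetical stand-ins chosen for the sketch:

```python
import random

def empirical_error(h, c, sample):
    """Fraction of the sample on which hypothesis h disagrees with target c."""
    return sum(h(x) != c(x) for x in sample) / len(sample)

# Hypothetical example: X = [0, 1], D = Uniform[0, 1].
c = lambda x: x >= 0.5   # target concept (assumed for illustration)
h = lambda x: x >= 0.6   # learned hypothesis (assumed for illustration)
random.seed(0)
sample = [random.random() for _ in range(10_000)]  # i.i.d. draws from D
print(empirical_error(h, c, sample))  # close to Prob_D[c(x) != h(x)] = 0.1
```

With a large enough i.i.d. sample, the empirical disagreement rate concentrates around the true error, which is what makes the (ε, δ) guarantee achievable.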


Similar articles

VC-Dimension and Shortest Path Algorithms

We explore the relationship between VC-dimension and graph algorithm design. In particular, we show that set systems induced by sets of vertices on shortest paths have VC-dimension at most two. This allows us to use a result from learning theory to improve time bounds on query algorithms for the point-to-point shortest path problem in networks of low highway dimension, such as road networks. We...


Learnability with VC-dimensions

By the above definitions we know that Π_F(n) ≤ |F| and Π_F(n) ≤ 2^n. Definition 1.2 (Shattering). A hypothesis class F shatters a finite set S ⊂ X iff |S_F(S)| = 2^{|S|}. Informally, shattering means that the function class can realize every +/− labeling of the given set of points. We want to bound the Rademacher average by a function of the VC-dimension. Lemma 1.1. Let F be...
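The shattering condition can be checked mechanically for a finite hypothesis class. A minimal sketch, using threshold functions on the real line as an assumed example (they are not taken from the abstract; their VC-dimension is 1):

```python
def shatters(hypotheses, S):
    """True iff the finite class realizes all 2^|S| labelings of the points in S."""
    realized = {tuple(int(h(x)) for x in S) for h in hypotheses}
    return len(realized) == 2 ** len(S)

# Threshold functions h_t(x) = 1 iff x >= t, for a few thresholds t.
thresholds = [lambda x, t=t: x >= t for t in (0.5, 1.5, 2.5)]
print(shatters(thresholds, [1.0]))       # True: any single point is shattered
print(shatters(thresholds, [1.0, 2.0]))  # False: the labeling (+, -) is unrealizable
```

The second call fails because no threshold can label the left point + and the right point −, which is exactly why thresholds cannot shatter any two-point set.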


Introduction to O-minimal Structures and an Application to Neural Network Learning

1. Definition of o-minimality and first examples 1 2. Basic real analysis and the monotonicity theorem 4 3. Definable Skolem functions and elimination of imaginaries 8 4. Dimension, part 1 10 5. Cell decomposition 11 6. Dimension, part 2 15 7. Restricted analytic functions and global exponentiation 17 8. NIP and neural networks 20 8.1. Vapnik-Chervonenkis dimension 20 8.2. O-minimal structures ...


Correction to 'Lower Bounds on the VC-Dimension of Smoothly Parametrized Function Classes'

The paper [3] gives lower bounds on the VC-dimension of various smoothly parametrized function classes. The results were proven by showing a relationship between the uniqueness of decision boundaries and the VC-dimension of smoothly parametrized function classes. The proof is incorrect; there is no such relationship under the conditions stated in [3]. For the case of neural networks with tanh act...


VC dimension of ellipsoids

We will establish that the VC dimension of the class of d-dimensional ellipsoids is (d² + 3d)/2, and that maximum likelihood estimation with N-component d-dimensional Gaussian mixture models induces a geometric class having VC dimension at least N(d² + 3d)/2.
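For orientation, the stated bound (reading the formula as (d² + 3d)/2) evaluates easily for small dimensions:

```python
def ellipsoid_vc(d):
    """(d^2 + 3d) / 2 — the stated VC dimension of d-dimensional ellipsoids."""
    return (d * d + 3 * d) // 2

print(ellipsoid_vc(2))      # 5: ellipses in the plane
print(ellipsoid_vc(3))      # 9: ellipsoids in 3-space
print(4 * ellipsoid_vc(3))  # 36: the lower bound for a 4-component 3-D mixture
```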




Publication date: 2012